- Fast Training Program;
-
- 1. Purpose;
- a. Initialize an MLP with random initial weights (see the sketch below)
- b. Train an MLP network using a method much faster than backpropagation (BP)
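-
- A minimal sketch of random weight initialization for a fully connected MLP
- (Python/numpy; the uniform range, function name, and layer sizes are chosen
- only for illustration and are not taken from the program):
-
-   import numpy as np
-
-   def init_mlp(layer_sizes, scale=0.5, seed=0):
-       """One (weights, biases) pair per connected layer pair, drawn
-       uniformly from [-scale, +scale] (an illustrative choice)."""
-       rng = np.random.default_rng(seed)
-       params = []
-       for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
-           w = rng.uniform(-scale, scale, size=(n_in, n_out))
-           b = rng.uniform(-scale, scale, size=n_out)
-           params.append((w, b))
-       return params
-
-   # e.g. the 4-5-2-1 structure used in the example run below
-   params = init_mlp([4, 5, 2, 1])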
-
- 2. Features;
- a. Uses a batching approach, so the order of training
- patterns is unimportant
- b. Adapts the learning factor during training (see the sketch after this list)
- c. Shows training MSE and error percentages
- d. Does not save weights to disk in the demo version
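-
- The program's own fast training method is not described in this file; purely
- as an illustration of the batching idea (a), learning-factor adaptation (b),
- and the MSE display (c), here is a Python/numpy sketch that runs plain batch
- gradient descent on a one-hidden-layer MLP and stops at an iteration limit
- or an MSE threshold. All names and the adaptation rule are assumptions.
-
-   import numpy as np
-
-   def train_batch(X, T, hidden=5, lr=0.03, n_iter=10, mse_goal=5e-4, seed=0):
-       rng = np.random.default_rng(seed)
-       W1 = rng.uniform(-0.5, 0.5, (X.shape[1], hidden))
-       W2 = rng.uniform(-0.5, 0.5, (hidden, T.shape[1]))
-       prev_mse = np.inf
-       for it in range(1, n_iter + 1):
-           H = np.tanh(X @ W1)          # whole batch at once: pattern order is irrelevant
-           Y = H @ W2
-           E = Y - T
-           mse = np.mean(E ** 2)
-           if mse <= mse_goal:          # stop once the MSE threshold is reached
-               print(f"iteration {it}: MSE {mse:.6f} (goal reached)")
-               break
-           # adapt the learning factor: grow it while the error keeps falling
-           lr = lr * 1.1 if mse < prev_mse else lr * 0.5
-           prev_mse = mse
-           print(f"iteration {it}: MSE {mse:.6f}  learning factor {lr:.4f}")
-           gW2 = H.T @ E / len(X)       # batch gradients
-           gH = (E @ W2.T) * (1.0 - H ** 2)
-           gW1 = X.T @ gH / len(X)
-           W1 -= lr * gW1
-           W2 -= lr * gW2
-       return W1, W2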
-
- 3. Example Run of Fast Training Program
- a. Go to the "Batch Processing" option and press <ret>
- b. Observe the parameter file with commented keyboard responses
- (a reading sketch follows the listing);
-
- 10, .0005 ! Enter number of iterations, MSE threshold
- 2 ! Enter 1 for old weights, 2 to initialize with random weights
- gls.top ! file storing network structure
- gls ! filename for training data
- 1 ! 1 if the data file contains desired outputs, 2 else
- 0 ! Enter number of patterns to read (0 for all training patterns)
- 0, 0 ! Enter numbers of 1st and last patterns to examine (0 0 for none)
- .03 ! learning factor
- gls1.wts ! filename for saving the trained weights
- 4 ! 1 to continue training, 2 to start new network, 3 for a new data file, 4 to stop
-
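- Each line of this file holds a value followed by a "!" comment. If you build
- or check such files from your own scripts, a reader might look like the
- Python sketch below (the helper name and parsing rules are assumptions based
- only on the listing above):
-
-   def read_params(path):
-       """Return the bare responses, in order, with the '!' comments removed."""
-       values = []
-       with open(path) as f:
-           for line in f:
-               answer = line.split("!", 1)[0].strip()
-               if answer:
-                   values.append(answer)
-       return values
-
-   # For the listing above this returns:
-   # ['10, .0005', '2', 'gls.top', 'gls', '1', '0', '0, 0', '.03', 'gls1.wts', '4']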
-
- The program will read all patterns from the file gls and train an MLP
- using the network structure file gls.top, which is shown below.
-
- 4
- 4 5 2 1
- 1 1 1
-
- The network will have 4 layers: 4 inputs, 7 hidden units split across
- 2 hidden layers (5 and 2), and 1 output. In addition, layers 2, 3,
- and 4 connect to all previous layers. Training will stop after 10
- iterations, or when the training MSE falls below the .0005 threshold.
- The final network weights will be stored in the file gls1.wts.
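-
- As an illustration, the three lines of such a structure file can be read back
- as follows (a Python sketch; the function name and the consistency check are
- assumptions, while the field meanings follow the description above):
-
-   def read_topology(path):
-       """Line 1: number of layers; line 2: units per layer;
-       line 3: one flag per non-input layer (1 = connect to all previous layers)."""
-       with open(path) as f:
-           n_layers = int(f.readline())
-           units = [int(x) for x in f.readline().split()]
-           connect_all = [int(x) for x in f.readline().split()]
-       assert len(units) == n_layers and len(connect_all) == n_layers - 1
-       return n_layers, units, connect_all
-
-   # For gls.top as shown above:
-   #   n_layers = 4, units = [4, 5, 2, 1] -> 4 inputs, 5 + 2 hidden units, 1 output
-   #   connect_all = [1, 1, 1]            -> layers 2, 3 and 4 see every earlier layer
-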
- c. Exit the DOS editor and observe the program running
- d. Go to the "Examine Program Output" option and press <ret>
- e. You can run this program on your own data simply by editing the
- parameter file in the "Batch Run" option.
-